109 research outputs found

    A decentralised neural model explaining optimal integration of navigational strategies in insects

    Insect navigation arises from the coordinated action of concurrent guidance systems, but the neural mechanisms through which each functions, and through which they are then coordinated, remain unknown. We propose that insects require distinct strategies to retrace familiar routes (route-following) and to return directly from novel to familiar terrain (homing), using different aspects of frequency-encoded views that are processed in different neural pathways. We also demonstrate how the Central Complex and Mushroom Body regions of the insect brain may work in tandem to coordinate the directional output of different guidance cues through a contextually switched ring attractor inspired by neural recordings. The resultant unified model of insect navigation reproduces behavioural data from a series of cue-conflict experiments in realistic animal environments and offers testable hypotheses of where and how insects process visual cues, how they utilise the different information that these provide, and how their outputs are coordinated to achieve the adaptive behaviours observed in the wild.
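    As a minimal sketch (not the paper's implementation) of the cue-integration idea described above: two directional cues can be combined by a certainty-weighted circular mean, which is the compromise an idealised ring attractor converges to when each guidance system injects a weighted "bump" of activity. The headings and weights below are illustrative.

```python
# Certainty-weighted combination of two directional cues (e.g. a homing vector
# and a route-following heading). An idealised ring attractor settles on a
# similar weighted compromise between the directions injected by each system.
import numpy as np

def integrate_headings(theta_a, w_a, theta_b, w_b):
    """Combine two headings (radians) weighted by their certainties."""
    # Represent each cue as a complex vector whose length encodes certainty, then sum.
    vec = w_a * np.exp(1j * theta_a) + w_b * np.exp(1j * theta_b)
    return np.angle(vec), np.abs(vec)  # combined heading and its overall confidence

# Example: a strong path-integration cue vs. a weaker visual-homing cue.
heading, confidence = integrate_headings(np.deg2rad(30), 0.8, np.deg2rad(90), 0.4)
print(np.rad2deg(heading), confidence)
```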

    A unified neural model explaining optimal multi-guidance coordination in insect navigation

    The robust navigation of insects arises from the coordinated action of concurrently functioning and interacting guidance systems. Computational models of specific brain regions can account for isolated behaviours such as path integration or route following, but the neural mechanisms by which their outputs are coordinated remain unknown. In this work, a functional modelling approach was taken to identify and model the elemental guidance subsystems required by homing insects. We then produced realistic adaptive behaviours by integrating the outputs of the different guidance systems in a biologically constrained unified model mapped onto identified neural circuits. Homing paths are quantitatively and qualitatively compared with real ant data in a series of simulation studies replicating key field experiments. Our analysis reveals that insects require independent visual homing and route following capabilities, which we show can be realised by encoding panoramic skylines in the frequency domain, using image-processing circuits in the optic lobe and learning pathways through the Mushroom Bodies (MB) and the Anterior Optic Tubercle (AOTU) to Bulb (BU) respectively, before converging in the Central Complex (CX) steering circuit. Further, we demonstrate that a ring attractor network inspired by firing patterns recorded in the CX can optimally integrate the outputs of the path integration and visual homing systems, guiding simulated ants back to their familiar route, and that a simple non-linear weighting function driven by the output of the MB provides a context-dependent switch, allowing route following strategies to dominate and the learned route to be retraced back to the nest when familiar terrain is encountered. The resultant unified model of insect navigation reproduces behavioural data from a series of cue-conflict experiments in realistic animal environments and offers testable hypotheses of where and how insects process visual cues, how they utilise the different information that these provide, and how they coordinate their outputs to achieve the adaptive behaviours observed in the wild. These results strengthen the case for a distributed architecture of the insect navigational toolkit. The unified model is then further validated by modelling the olfactory navigation of flies and ants. With simple adaptations of the sensory inputs, the model reproduces the main characteristics of the observed behavioural data, further demonstrating the useful role played by the sensory-processing to CX to motor pathway in generating context-dependent coordination behaviours. In addition, this extension helps to complete the unified model of insect navigation by adding olfactory cues, which are among the most crucial cues for insects.
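    Two ingredients of this abstract lend themselves to a short sketch under stated assumptions: encoding a 1-D panoramic skyline in the frequency domain, where the amplitude spectrum is invariant to the agent's rotation (a circular shift of the panorama), and a simple non-linear weighting that lets route following dominate once MB familiarity is high. The number of coefficients, threshold, and gain below are illustrative choices, not values from the paper.

```python
# Frequency-domain skyline encoding and a sigmoidal familiarity switch (sketch).
import numpy as np

def frequency_encode(skyline, n_coeffs=8):
    """Return low-order Fourier amplitudes of a panoramic skyline (1-D array)."""
    spectrum = np.fft.rfft(skyline)
    return np.abs(spectrum[:n_coeffs])      # amplitudes are rotation-invariant

def route_following_weight(mb_familiarity, threshold=0.6, gain=20.0):
    """Non-linear switch: near 0 on unfamiliar terrain, near 1 on the learned route."""
    return 1.0 / (1.0 + np.exp(-gain * (mb_familiarity - threshold)))

# A rotated copy of the same skyline yields the same amplitude code.
az = np.linspace(0, 2 * np.pi, 360)
skyline = np.sin(az) + 0.1 * np.cos(3 * az)
print(np.allclose(frequency_encode(skyline), frequency_encode(np.roll(skyline, 90))))
print(route_following_weight(0.9))   # route following dominates when familiar
```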

    A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics

    This paper describes a versatile platform for swarm robotics research. It integrates multiple-pheromone communication with a dynamic visual scene, along with real-time data transmission and localization of multiple robots. The platform has been built for inquiries into social insect behavior and bio-robotics. By introducing a new research scheme to coordinate olfactory and visual cues, it not only complements current swarm robotics platforms, which focus only on pheromone communication, by adding visual interaction, but also may fill an important gap in closing the loop from bio-robotics to neuroscience. We have built a controllable dynamic visual environment based on our previously developed ColCOSΦ (a multi-pheromone platform) by enclosing the arena with LED panels and having the micro mobile robots interact with it through a visual sensor. In addition, a wireless communication system has been developed to allow transmission of real-time bi-directional data between multiple micro robot agents and a PC host. A case study combining concepts from the Internet of Vehicles (IoV) and an insect-vision-inspired model has been undertaken to verify the applicability of the presented platform and to investigate how complex scenarios can be facilitated by making use of it.
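    The abstract does not specify the platform's wire protocol, so the following is only a hypothetical illustration of the kind of real-time, bi-directional host-to-robot exchange it describes: a PC host broadcasting camera-based localization updates and collecting status packets from the robots over UDP. The ports and message fields are made up for the example.

```python
# Hypothetical host-side loop for bi-directional data exchange with micro robots.
import json
import socket

HOST_PORT = 9000        # illustrative port the PC host listens on
ROBOT_PORT = 9001       # illustrative port each robot listens on

def make_host_socket():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.bind(("", HOST_PORT))
    sock.settimeout(0.05)
    return sock

def broadcast_poses(sock, poses):
    """Send the latest localization of every robot to all agents."""
    message = json.dumps({"type": "pose", "robots": poses}).encode()
    sock.sendto(message, ("<broadcast>", ROBOT_PORT))

def poll_robot_status(sock):
    """Collect any status packets (e.g. battery, sensor readings) sent back."""
    packets = []
    try:
        while True:
            data, addr = sock.recvfrom(1024)
            packets.append((addr, json.loads(data.decode())))
    except socket.timeout:
        return packets

if __name__ == "__main__":
    sock = make_host_socket()
    broadcast_poses(sock, {"robot_1": [0.42, 0.13, 1.57]})  # x, y, heading
    print(poll_robot_status(sock))
```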

    ColCOSΦ: A Multiple Pheromone Communication System for Swarm Robotics and Social Insects Research

    In the last few decades we have witnessed how the pheromones of social insects have become a rich source of inspiration for swarm robotics. By utilising virtual pheromones in physical swarm robot systems to coordinate individuals and realise direct/indirect inter-robot communication as social insects do, stigmergic behaviour has emerged. However, many studies take only a single pheromone into account when solving swarm problems, which is not the case in real insects. In the real social insect world, diverse behaviours, complex collective performances and flexible transitions from one state to another are guided by different kinds of pheromones and their interactions. Therefore, whether a multiple-pheromone-based strategy can inspire swarm robotics research, and conversely how the performance of swarm robots controlled by multiple pheromones can help explain social insects' behaviours, becomes an interesting question. Thus, to provide a reliable system for studying multiple pheromones, in this paper we propose and realise a multiple-pheromone communication system called ColCOSΦ. This system consists of a virtual pheromone sub-system, wherein the multiple pheromones are represented by a colour image displayed on a screen, and a micro-robot platform designed for swarm robotics applications. Two case studies are undertaken to verify the effectiveness of this system: one on multiple-pheromone-based ant foraging and another on the interaction of aggregation and alarm pheromones. The experimental results demonstrate the feasibility of ColCOSΦ and its great potential in directing swarm robotics and social insect research.
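    A minimal sketch, and not the ColCOSΦ implementation itself, of the core idea that multiple virtual pheromones can be kept as separate fields and rendered together as one colour image (one pheromone per RGB channel) shown on the arena screen. The grid size, evaporation rate and diffusion width below are illustrative assumptions.

```python
# Multi-pheromone field rendered as a colour image (illustrative sketch).
import numpy as np
from scipy.ndimage import gaussian_filter

H, W = 240, 320                            # arena resolution in pheromone cells
field = np.zeros((H, W, 3), dtype=float)   # channels 0/1/2 = three pheromones

def deposit(field, x, y, channel, amount=1.0):
    """A robot at cell (x, y) lays `amount` of the given pheromone."""
    field[y, x, channel] += amount

def step(field, evaporation=0.01, diffusion_sigma=1.0):
    """One update: every pheromone evaporates and diffuses independently."""
    field *= (1.0 - evaporation)
    for c in range(field.shape[2]):
        field[:, :, c] = gaussian_filter(field[:, :, c], sigma=diffusion_sigma)
    return field

def render(field):
    """Map pheromone concentrations to an 8-bit colour image for the screen."""
    img = np.clip(field / (field.max() + 1e-9), 0.0, 1.0)
    return (255 * img).astype(np.uint8)

deposit(field, 160, 120, channel=0)   # e.g. an attractive trail pheromone
deposit(field, 40, 60, channel=1)     # e.g. an alarm pheromone
for _ in range(10):
    step(field)
print(render(field).shape)
```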

    Affordance-Driven Next-Best-View Planning for Robotic Grasping

    Grasping occluded objects in cluttered environments is an essential component of complex robotic manipulation tasks. In this paper, we introduce an AffordanCE-driven Next-Best-View planning policy (ACE-NBV) that tries to find a feasible grasp for the target object by continuously observing the scene from new viewpoints. This policy is motivated by the observation that the grasp affordances of an occluded object can be better measured from a viewpoint whose view direction is the same as the grasp direction. Specifically, our method leverages the paradigm of novel view imagery to predict the grasp affordances under previously unobserved views, and selects the next observation view based on the highest imagined grasp quality of the target object. The experimental results in simulation and on a real robot demonstrate the effectiveness of the proposed affordance-driven next-best-view planning policy. Project page: https://sszxc.net/ace-nbv/. Comment: Conference on Robot Learning (CoRL) 202
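    A schematic sketch of the selection rule described above: imagine the grasp affordances the target would show from each unobserved candidate view and move the camera to the view with the highest predicted grasp quality. The `predict_grasp_quality` callable below is a hypothetical placeholder standing in for the paper's learned novel-view affordance model.

```python
# Next-best-view selection by imagined grasp quality (schematic sketch).
from typing import Callable, Sequence
import numpy as np

def next_best_view(candidate_views: Sequence[np.ndarray],
                   predict_grasp_quality: Callable[[np.ndarray], float]) -> np.ndarray:
    """Return the candidate camera pose with the highest imagined grasp quality."""
    qualities = [predict_grasp_quality(view) for view in candidate_views]
    return candidate_views[int(np.argmax(qualities))]

# Toy usage: candidate views as look-at directions, scored by a dummy predictor
# (a real system would query the learned affordance model instead).
rng = np.random.default_rng(0)
views = [rng.normal(size=3) for _ in range(16)]
best = next_best_view(views, lambda v: -np.linalg.norm(v - np.array([0.0, 0.0, 1.0])))
print(best)
```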

    Unsupervised image saliency detection with Gestalt-laws guided optimization and visual attention based refinement.

    Visual attention is a fundamental cognitive capability that allows human beings to focus on regions of interest (ROIs) in complex natural environments. Which ROIs we pay attention to depends mainly on two distinct types of attentional mechanism. The bottom-up mechanism guides our detection of salient objects and regions through externally driven factors such as color and location, whilst the top-down mechanism biases attention based on prior knowledge and cognitive strategies provided by the visual cortex. However, how to practically use and fuse both attentional mechanisms for salient object detection has not been sufficiently explored. To this end, we propose in this paper an integrated framework consisting of bottom-up and top-down attention mechanisms that enables attention to be computed at the level of salient objects and/or regions. Within our framework, the model of the bottom-up mechanism is guided by the Gestalt laws of perception. We interpret the Gestalt laws of homogeneity, similarity, proximity and figure-ground in terms of color and spatial contrast at the level of regions and objects to produce a feature contrast map. The model of the top-down mechanism uses a formal computational model to describe the background connectivity of attention and produce a priority map. Integrating both mechanisms and applying them to salient object detection, our results demonstrate that the proposed method consistently outperforms a number of existing unsupervised approaches on five challenging and complicated datasets in terms of higher precision and recall rates, AP (average precision) and AUC (area under curve) values.
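    The following is a simplified stand-in, not the paper's model, for fusing the two maps the abstract describes: a bottom-up contrast map (here, local color contrast) and a top-down prior (here, a crude background proxy that down-weights pixels resembling the image border). The window size, border width and multiplicative fusion are illustrative choices.

```python
# Toy bottom-up/top-down saliency fusion (simplified stand-in).
import numpy as np
from scipy.ndimage import uniform_filter

def bottom_up_contrast(image):
    """Per-pixel color contrast against a local neighbourhood mean."""
    local_mean = np.stack([uniform_filter(image[..., c], size=15) for c in range(3)], axis=-1)
    return np.linalg.norm(image - local_mean, axis=-1)

def top_down_prior(image, border=10):
    """Down-weight pixels whose color is close to the average border color."""
    border_pixels = np.concatenate([image[:border].reshape(-1, 3),
                                    image[-border:].reshape(-1, 3),
                                    image[:, :border].reshape(-1, 3),
                                    image[:, -border:].reshape(-1, 3)])
    dist = np.linalg.norm(image - border_pixels.mean(axis=0), axis=-1)
    return dist / (dist.max() + 1e-9)

def saliency(image):
    """Fuse the feature contrast map and the priority map multiplicatively."""
    s = bottom_up_contrast(image) * top_down_prior(image)
    return s / (s.max() + 1e-9)

img = np.random.rand(120, 160, 3)   # placeholder for a natural image
print(saliency(img).shape)          # (120, 160) saliency map in [0, 1]
```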

    Investigating Multiple Pheromones in Swarm Robots - A Case Study of Multi-Robot Deployment

    Social insects are known as experts at handling complex tasks in a collectively smart way, although their small brains contain only limited computational resources and sensory information. It is believed that pheromones play a vital role in shaping social insects' collective behaviours. One of the key points underlying stigmergy is the combination of different pheromones in a specific task. In the swarm intelligence field, pheromone-inspired studies usually focus on a single pheromone at a time, so it is not clear how effectively multiple pheromones could be employed for a collective strategy in the real physical world. In this study, we investigate a multiple-pheromone-based deployment strategy for swarm robots inspired by social insects. The proposed deployment strategy uses two kinds of artificial pheromone, an attractive and a repellent pheromone, enabling micro robots to be distributed to desired positions with high efficiency. The strategy is assessed systematically in both simulation and real-robot experiments using a novel artificial pheromone platform, ColCOSΦ. Results from the simulation and real-robot experiments both demonstrate the effectiveness of the proposed strategy and reveal the role of multiple pheromones. The feasibility of the ColCOSΦ platform and its potential for further robotic research on multiple pheromones are also verified. Our study of using different pheromones for one collective swarm robotics task may also help or inspire biologists in research on real insects.
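    A minimal sketch, with made-up gains and fields, of the two-pheromone deployment idea: each robot climbs the gradient of the attractive pheromone (marking desired positions) and descends the gradient of the repellent pheromone (laid by robots already in place), so the swarm spreads over the target locations. The control rule and constants are illustrative, not taken from the paper.

```python
# Gradient-following deployment with attractive and repellent pheromones (sketch).
import numpy as np

def local_gradient(field, x, y):
    """Finite-difference gradient of a pheromone field at integer cell (x, y)."""
    gx = field[y, min(x + 1, field.shape[1] - 1)] - field[y, max(x - 1, 0)]
    gy = field[min(y + 1, field.shape[0] - 1), x] - field[max(y - 1, 0), x]
    return np.array([gx, gy])

def deployment_step(attractive, repellent, x, y, k_att=1.0, k_rep=1.5):
    """Desired heading for one robot: up the attractive, down the repellent gradient."""
    move = k_att * local_gradient(attractive, x, y) - k_rep * local_gradient(repellent, x, y)
    return np.arctan2(move[1], move[0])    # heading in radians

# Toy fields: attraction peaks at cell (50, 50); another robot already sits there.
attractive = np.fromfunction(lambda y, x: -((x - 50) ** 2 + (y - 50) ** 2), (100, 100))
repellent = np.zeros((100, 100))
repellent[48:53, 48:53] = 5.0
print(deployment_step(attractive, repellent, 40, 40))
```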